Entropic Graph-based Posterior Regularization for Learning Probabilistic Models

Authors

  • Maxwell W. Libbrecht
  • Michael M. Hoffman
  • Jeffrey A. Bilmes
  • William S. Noble
Abstract

The advent of high-throughput DNA sequencing methods has led to an explosion in the availability of genome-wide data and a corresponding opportunity to use computational methods to derive insights into cellular function. A predominant form of statistical model used for genomic data has been temporal, such as the hidden Markov model (HMM) (Rabiner, 1989) or dynamic Bayesian network (DBN) (Dean and Kanazawa, 1988). Temporal models consist of a chain of random variables exhibiting the temporal Markov property, which asserts that “future” and “past” variables are independent given some notion of the present. This property enables exact inference to be performed using temporal dynamic programming. Most often, inference takes the form of a forward and then a backward pass along the chain. In genomics, the “temporal” axis is generally position along the genome rather than actual time.
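
Although the page itself carries no code, the forward-backward inference described above can be made concrete. Below is a minimal sketch of scaled forward-backward message passing for a toy discrete HMM in Python; the function name and all parameter values are invented for illustration.

    import numpy as np

    def forward_backward(pi, A, B, obs):
        """Posterior marginals p(z_t | x_1..x_T) for a discrete HMM.

        pi : (K,) initial state distribution
        A  : (K, K) transitions, A[i, j] = p(z_{t+1} = j | z_t = i)
        B  : (K, M) emissions,   B[k, m] = p(x_t = m | z_t = k)
        obs: (T,) observed symbol indices
        """
        T, K = len(obs), len(pi)
        alpha = np.zeros((T, K))   # scaled forward messages
        beta = np.zeros((T, K))    # scaled backward messages
        scale = np.zeros(T)

        # Forward pass along the chain.
        alpha[0] = pi * B[:, obs[0]]
        scale[0] = alpha[0].sum()
        alpha[0] /= scale[0]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
            scale[t] = alpha[t].sum()
            alpha[t] /= scale[t]

        # Backward pass, reusing the forward scaling factors.
        beta[T - 1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1]) / scale[t + 1]

        gamma = alpha * beta       # posterior marginal at each position
        return gamma / gamma.sum(axis=1, keepdims=True)

    # Toy 2-state, 2-symbol chain; all numbers are invented.
    pi = np.array([0.6, 0.4])
    A = np.array([[0.9, 0.1], [0.2, 0.8]])
    B = np.array([[0.8, 0.2], [0.3, 0.7]])
    print(forward_backward(pi, A, B, obs=np.array([0, 0, 1, 1, 0])))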

Similar articles

Entropic Graph-based Posterior Regularization: Extended Version

Graph smoothness objectives have achieved great success in semi-supervised learning but have not yet been applied extensively to unsupervised generative models. We define a new class of entropic graph-based posterior regularizers that augment a probabilistic model by encouraging pairs of nearby variables in a regularization graph to have similar posterior distributions. We present a three-way a...

Entropic Graph-based Posterior Regularization

Graph smoothness objectives have achieved great success in semi-supervised learning but have not yet been applied extensively to unsupervised generative models. We define a new class of entropic graph-based posterior regularizers that augment a probabilistic model by encouraging pairs of nearby variables in a regularization graph to have similar posterior distributions. We present a three-way a...

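As a rough illustration of the idea in these two abstracts, the sketch below evaluates a graph-smoothness penalty on posterior marginals: each edge of a regularization graph contributes a symmetrized KL divergence between the posteriors at its endpoints, so the penalty vanishes exactly when connected variables agree. This is an assumption-laden reading of the abstract, not the paper's actual regularizer or its three-way optimization scheme, and the function and variable names are mine.

    import numpy as np

    def entropic_graph_penalty(q, edges, weights=None, eps=1e-12):
        """Graph-smoothness penalty on posterior marginals.

        q      : (N, K) array; q[i] is a distribution over K labels
                 for variable i.
        edges  : list of (u, v) index pairs in the regularization graph.
        weights: optional per-edge weights (defaults to 1.0).
        """
        if weights is None:
            weights = np.ones(len(edges))
        q = np.clip(q, eps, None)  # guard the logs against zeros
        total = 0.0
        for w, (u, v) in zip(weights, edges):
            kl_uv = np.sum(q[u] * np.log(q[u] / q[v]))
            kl_vu = np.sum(q[v] * np.log(q[v] / q[u]))
            total += w * 0.5 * (kl_uv + kl_vu)
        return total

    # Two variables with similar posteriors, one dissimilar: the
    # second edge dominates the penalty.
    q = np.array([[0.9, 0.1], [0.85, 0.15], [0.2, 0.8]])
    print(entropic_graph_penalty(q, edges=[(0, 1), (1, 2)]))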

Posterior Regularization for Structured Latent Variable Models

We present posterior regularization, a probabilistic framework for structured, weakly supervised learning. Our framework efficiently incorporates indirect supervision via constraints on posterior distributions of probabilistic models with latent variables. Posterior regularization separates model complexity from the complexity of structural constraints it is desired to satisfy. By directly impo...

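To give the posterior-regularization recipe some shape: its E-step projects the model posterior onto a constraint set by minimizing KL divergence. The sketch below handles only the simplest case, a single inequality constraint on a categorical posterior, where the dual solution is an exponential tilting of the model posterior with a scalar multiplier found by bisection. The paper treats general structured constraint sets; this toy projection, and its names, are mine.

    import numpy as np

    def pr_project(p, phi, b, lam_max=100.0, tol=1e-8):
        """KL projection of a categorical posterior onto {q : E_q[phi] <= b}.

        p   : (K,) model posterior over K latent configurations
        phi : (K,) constraint feature value per configuration
        b   : bound on the expected feature under q

        Solves min_q KL(q || p) s.t. E_q[phi] <= b. The dual optimum
        has the form q(z) proportional to p(z) * exp(-lam * phi(z))
        with lam >= 0; E_q[phi] is decreasing in lam, so bisection on
        the constraint recovers lam.
        """
        def q_of(lam):
            w = p * np.exp(-lam * phi)
            return w / w.sum()

        if p @ phi <= b:           # constraint already satisfied: q = p
            return p.copy()
        lo, hi = 0.0, lam_max      # assumes lam_max is large enough
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if q_of(mid) @ phi > b:
                lo = mid
            else:
                hi = mid
        return q_of(hi)

    # Toy example: push the expected feature value down to 0.5.
    p = np.array([0.7, 0.2, 0.1])
    phi = np.array([1.0, 0.0, 0.0])
    q = pr_project(p, phi, b=0.5)
    print(q, q @ phi)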

Graphical Model Structure Learning with ℓ1-Regularization

This work looks at fitting probabilistic graphical models to data when the structure is not known. The main tool to do this is ℓ1-regularization and the more general group ℓ1-regularization. We describe limited-memory quasi-Newton methods to solve optimization problems with these types of regularizers, and we examine learning directed acyclic graphical models with ℓ1-regularization, learning un...

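For a sense of what ℓ1-regularized structure learning does in the Gaussian case, the sketch below uses scikit-learn's GraphicalLasso (a coordinate-descent solver, not the limited-memory quasi-Newton methods this work describes) to recover a sparse precision matrix, whose nonzero off-diagonal entries define the learned graph. The ground-truth precision matrix and the penalty strength alpha are invented for illustration.

    import numpy as np
    from sklearn.covariance import GraphicalLasso

    rng = np.random.default_rng(0)

    # Sample from a Gaussian whose precision matrix has a known sparse
    # chain structure (1-2, 2-3); the nonzeros of the estimated
    # precision matrix are the recovered edges.
    prec = np.array([[2.0, 0.8, 0.0],
                     [0.8, 2.0, 0.8],
                     [0.0, 0.8, 2.0]])
    cov = np.linalg.inv(prec)
    X = rng.multivariate_normal(np.zeros(3), cov, size=2000)

    model = GraphicalLasso(alpha=0.05).fit(X)  # alpha = l1 strength
    edges = np.abs(model.precision_) > 1e-3
    np.fill_diagonal(edges, False)
    print(np.round(model.precision_, 2))
    print("recovered edges:", np.argwhere(np.triu(edges)))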

Publication date: 2013